
Appendix A Performance on real-world based instances

Neural Information Processing Systems

We further evaluate SGBS+EAS on nine real-world-based instance sets from [15]. Each instance set consists of 20 instances with similar characteristics (i.e., they have been sampled from the same underlying distribution). The instance sets differ significantly in several structural properties, for example, the number of customers n and their positions (e.g., clustered vs. random). A more detailed description of the instance sets can be found in [15]. One major advantage of neural combinatorial optimization approaches over traditional handcrafted optimization methods is their ability to quickly learn customized heuristics for new problem settings.


Heuristic Domain Adaptation, Appendix A.1 Main Differences (Table 1: Main differences between heuristic search and HDA)

Neural Information Processing Systems

Heuristic search is devised for path search problems, whereas heuristic domain adaptation (HDA) addresses the task of domain adaptation. These different tasks lead to different properties, which we summarize in Table 1. Heuristic search aims to find a least-cost path to the destination, while the goal of HDA is to obtain domain-invariant representations in a goal-oriented manner.


Generative Hierarchical Materials Search

Neural Information Processing Systems

Generative models trained at scale can now produce text, video, and more recently, scientific data such as crystal structures. In applications of generative approaches to materials science, and in particular to crystal structures, guidance from the domain expert in the form of high-level instructions can be essential for an automated system to output candidate crystals that are viable for downstream research. In this work, we formulate end-to-end language-to-structure generation as a multi-objective optimization problem, and propose Generative Hierarchical Materials Search (GenMS) for controllable generation of crystal structures. GenMS consists of (1) a language model that takes high-level natural language as input and generates intermediate textual information about a crystal (e.g., chemical formulae), and (2) a diffusion model that takes intermediate information as input and generates low-level continuous-valued crystal structures. GenMS additionally uses a graph neural network to predict properties (e.g., formation energy) from the generated crystal structures. During inference, GenMS leverages all three components to conduct a forward tree search over the space of possible structures. Experiments show that GenMS outperforms alternatives that directly use language models to generate structures, both in satisfying user requests and in generating low-energy structures. We confirm that GenMS is able to generate common crystal structures such as double perovskites or spinels solely from natural language input, and hence can form the foundation for more complex structure generation in the near future.
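The forward tree search described above can be sketched as a best-first search that chains the three components. The sketch below uses hypothetical stand-ins for GenMS's language model, diffusion model, and GNN property predictor (none of the function names or scoring details come from the paper); it only illustrates the control flow of expanding candidate formulae and keeping the lowest-energy structure.

```python
import heapq

# Hypothetical stand-ins for GenMS's three components; the real models
# (language model, diffusion model, GNN predictor) are not reproduced here.
def propose_formulae(prompt, k=3):
    # A language model would return k candidate chemical formulae.
    return [f"{prompt}-candidate-{i}" for i in range(k)]

def generate_structure(formula):
    # A diffusion model would turn a formula into a crystal structure.
    return {"formula": formula}

def predict_energy(structure):
    # A GNN would predict formation energy; lower is better.
    # Placeholder: a deterministic pseudo-score in [0, 1).
    return hash(structure["formula"]) % 100 / 100.0

def forward_tree_search(prompt, expansions=2):
    """Best-first search: expand formulae, score structures, keep the best."""
    frontier = [(0.0, prompt)]
    best = (float("inf"), None)
    for _ in range(expansions):
        _, node = heapq.heappop(frontier)
        for f in propose_formulae(node):
            s = generate_structure(f)
            e = predict_energy(s)
            if e < best[0]:
                best = (e, s)
            heapq.heappush(frontier, (e, f))
    return best
```

With real models plugged in, the heap order would be driven by the predicted formation energy, so the search preferentially expands promising branches.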


Bayesian Optimization of Functions over Node Subsets in Graphs

Neural Information Processing Systems

We address the problem of optimizing over functions defined on node subsets in a graph. The optimization of such functions is often a non-trivial task given their combinatorial, black-box and expensive-to-evaluate nature. Although various algorithms have been introduced in the literature, most are either task-specific or computationally inefficient and only utilize information about the graph structure without considering the characteristics of the function. To address these limitations, we utilize Bayesian Optimization (BO), a sample-efficient black-box solver, and propose a novel framework for combinatorial optimization on graphs. More specifically, we map each k-node subset in the original graph to a node in a new combinatorial graph and adopt a local modeling approach to efficiently traverse the latter graph by progressively sampling its subgraphs using a recursive algorithm. Extensive experiments under both synthetic and real-world setups demonstrate the effectiveness of the proposed BO framework on various types of graphs and optimization tasks, where its behavior is analyzed in detail with ablation studies.
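The core idea of mapping k-node subsets to nodes of a new combinatorial graph can be made concrete with a simple neighborhood structure: two k-subsets are adjacent when they differ by swapping one member for one non-member. The sketch below is an illustrative greedy walk over that graph, not the paper's BO surrogate or acquisition strategy; `f` stands in for the expensive black-box objective.

```python
def subset_neighbors(subset, nodes):
    """Neighbors of a k-subset in the combinatorial graph: swap one member
    for one non-member, keeping the subset size fixed."""
    subset = set(subset)
    for out in subset:
        for inn in set(nodes) - subset:
            yield frozenset(subset - {out} | {inn})

def local_search(f, nodes, k, steps=50):
    """Greedy first-improvement traversal, standing in for the
    acquisition-driven walk of the actual BO framework."""
    current = frozenset(list(nodes)[:k])
    best_val = f(current)
    for _ in range(steps):
        improved = False
        for nb in subset_neighbors(current, nodes):
            v = f(nb)
            if v > best_val:
                current, best_val, improved = nb, v, True
                break
        if not improved:
            break
    return current, best_val
```

A BO method would replace the exhaustive neighbor evaluation with a surrogate model fit to past evaluations, sampling subgraphs of this combinatorial graph rather than enumerating neighbors.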



A Beam Search Algorithm

Neural Information Processing Systems

Algorithm 1 demonstrates the step-by-step operations of our beam search algorithm (see Sec. 4.3).

B.1 Preprocessing details for WikiText-103 and Enron Email

We consider recovering sentences in the current work; as WikiText-103 and Enron Email have longer contexts, we leave recovering longer paragraphs as future work. We keep 2,000 examples of each dataset as the evaluation set and use the rest for training.
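Since Algorithm 1 itself is not reproduced here, the following is a generic beam search sketch for sequence decoding, under the assumption that a scoring function returns log-probability continuations for a partial sequence; it is not the paper's exact procedure.

```python
def beam_search(score_next, start, width=3, max_len=10, eos="</s>"):
    """Generic beam search sketch (not Algorithm 1 verbatim).
    score_next(seq) returns a list of (token, log_prob) continuations."""
    beams = [([start], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, lp in beams:
            if seq[-1] == eos:
                finished.append((seq, lp))  # already complete, keep as-is
                continue
            for tok, tok_lp in score_next(seq):
                candidates.append((seq + [tok], lp + tok_lp))
        if not candidates:
            break
        # Keep only the `width` highest-scoring partial sequences.
        beams = sorted(candidates, key=lambda x: x[1], reverse=True)[:width]
    finished.extend(beams)
    return max(finished, key=lambda x: x[1])
```

In the sentence-recovery setting, `score_next` would be backed by the attacker's language model over candidate tokens.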


Generalization Error Bounds for Two-stage Recommender Systems with Tree Structure

Neural Information Processing Systems

Two-stage recommender systems play a crucial role in efficiently identifying relevant items and personalizing recommendations from a vast array of options. This paper, based on an error decomposition framework, analyzes the generalization error for two-stage recommender systems with a tree structure, which consist of an efficient tree-based retriever and a more precise yet time-consuming ranker. We use the Rademacher complexity to establish the generalization upper bound for various tree-based retrievers using beam search, as well as for different ranker models under a shifted training distribution. Both theoretical insights and practical experiments on real-world datasets indicate that increasing the branches in tree-based retrievers and harmonizing distributions across stages can enhance the generalization performance of two-stage recommender systems.


Robust and Faster Zeroth-Order Minimax Optimization: Complexity and Applications (Weixin An)

Neural Information Processing Systems

Many zeroth-order (ZO) optimization algorithms have been developed to solve nonconvex minimax problems in machine learning and computer vision. However, existing ZO minimax algorithms have high complexity and rely on strict, restrictive conditions for their ZO estimates.
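The ZO estimates mentioned above approximate gradients from function values only. A standard two-point Gaussian smoothing estimator (a common textbook scheme, not necessarily the one used in this paper) looks as follows:

```python
import numpy as np

def zo_gradient(f, x, mu=1e-3, n_samples=20, rng=None):
    """Two-point zeroth-order gradient estimate:
        g ~ (1/n) * sum_i (f(x + mu*u_i) - f(x)) / mu * u_i,
    with u_i ~ N(0, I). Only function evaluations of f are used."""
    rng = np.random.default_rng(rng)
    g = np.zeros_like(x, dtype=float)
    fx = f(x)
    for _ in range(n_samples):
        u = rng.standard_normal(x.shape)
        g += (f(x + mu * u) - fx) / mu * u
    return g / n_samples
```

The smoothing radius `mu` and sample count `n_samples` control the bias-variance trade-off of the estimate; the restrictive conditions the abstract refers to typically concern such smoothness and variance assumptions.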


Erdős Goes Neural: an Unsupervised Learning Framework for Combinatorial Optimization on Graphs

Neural Information Processing Systems

Combinatorial optimization (CO) problems are notoriously challenging for neural networks, especially in the absence of labeled instances. This work proposes an unsupervised learning framework for CO problems on graphs that can provide integral solutions of certified quality. Inspired by Erdős' probabilistic method, we use a neural network to parametrize a probability distribution over sets. Crucially, we show that when the network is optimized w.r.t. a suitably chosen loss, the learned distribution contains, with controlled probability, a low-cost integral solution that obeys the constraints of the combinatorial problem. The probabilistic proof of existence is then derandomized to decode the desired solutions. We demonstrate the efficacy of this approach to obtain valid solutions to the maximum clique problem and to perform local graph clustering. Our method achieves competitive results on both real datasets and synthetic hard instances.
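The derandomization step described above can be illustrated on a toy objective. The sketch below uses the method of conditional expectations on an expected cut value (a simpler objective than the paper's constrained losses, chosen because its expectation is available in closed form): fixing nodes one at a time to whichever value does not decrease the expectation yields an integral solution at least as good as the initial expectation.

```python
def expected_cut(p, edges):
    """Expected cut value when node i lands on side 1 independently w.p. p[i]."""
    return sum(p[i] * (1 - p[j]) + p[j] * (1 - p[i]) for i, j in edges)

def derandomize(p, edges):
    """Method of conditional expectations: fix nodes one by one, each time
    keeping the choice that does not decrease the expected cut. The final
    integral assignment achieves at least the initial expectation."""
    p = list(p)
    for i in range(len(p)):
        p[i] = 1.0
        hi = expected_cut(p, edges)
        p[i] = 0.0
        lo = expected_cut(p, edges)
        p[i] = 1.0 if hi >= lo else 0.0
    return [int(x) for x in p]
```

In the framework above, the probabilities `p` would come from the trained network, and the decoded integral solution inherits the certified quality of the learned distribution.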


Fair Clustering Under a Bounded Cost

Neural Information Processing Systems

Clustering is a fundamental unsupervised learning problem where a dataset is partitioned into clusters that consist of nearby points in a metric space. A recent variant, fair clustering, associates a color with each point representing its group membership and requires that each color has (approximately) equal representation in each cluster to satisfy group fairness.
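The group-fairness requirement can be checked directly by comparing each cluster's color fractions with the colors' overall proportions. The helper below is an illustrative diagnostic, not part of the paper's algorithm.

```python
from collections import Counter

def color_balance(clusters, colors):
    """For each cluster (a list of point indices), the fraction of points of
    each color; group fairness asks these fractions to be close to the
    colors' overall proportions in the dataset."""
    overall = Counter(colors)
    total = len(colors)
    report = []
    for cluster in clusters:
        counts = Counter(colors[i] for i in cluster)
        report.append({c: counts[c] / len(cluster) for c in overall})
    global_frac = {c: overall[c] / total for c in overall}
    return report, global_frac
```

A fair clustering under this notion would make every entry in `report` approximately equal to `global_frac`.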